Oracle-Efficient Regret Minimization in Factored MDPs with Unknown Structure

Neural Information Processing Systems

We study regret minimization in non-episodic factored Markov decision processes (FMDPs), where all existing algorithms make the strong assumption that the factored structure of the FMDP is known to the learner in advance. In this paper, we provide the first algorithm that learns the structure of the FMDP while minimizing the regret. Our algorithm is based on the optimism in the face of uncertainty principle, combined with a simple statistical method for structure learning, and can be implemented efficiently given oracle-access to an FMDP planner. Moreover, we give a variant of our algorithm that remains efficient even when the oracle is limited to non-factored actions, which is the case with almost all existing approximate planners. Finally, we leverage our techniques to prove a novel lower bound for the known structure case, closing the gap to the regret bound of Chen et al. [2021].
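The abstract's "simple statistical method for structure learning" is not specified here, but the general idea of recovering an FMDP's factored structure from transition data can be sketched with a toy example. The sketch below is an illustration, not the paper's algorithm: it builds a two-factor binary MDP in which the next value of factor x1 depends only on the current value of factor x0, then selects parents for each factor by checking whether conditioning on a candidate parent changes the empirical next-state distribution (measured in total variation). The dynamics, sample size, and threshold are all hypothetical.

```python
import random
from collections import Counter

random.seed(0)

# Toy factored chain: state = (x0, x1), both binary.
# x0' is pure noise (no parents); x1' copies x0 with prob 0.9.
def step(state):
    x0, _x1 = state
    nx0 = random.randint(0, 1)
    nx1 = x0 if random.random() < 0.9 else 1 - x0
    return (nx0, nx1)

# Collect a trajectory of (state, next_state) transitions.
data = []
state = (0, 0)
for _ in range(5000):
    nxt = step(state)
    data.append((state, nxt))
    state = nxt

def tv_dependence(data, parent, target):
    """Total-variation distance between the empirical distributions of
    next[target] conditioned on current[parent] = 0 vs = 1."""
    counts = {0: Counter(), 1: Counter()}
    for s, ns in data:
        counts[s[parent]][ns[target]] += 1
    probs = {}
    for v in (0, 1):
        tot = sum(counts[v].values())
        probs[v] = {k: counts[v][k] / tot for k in (0, 1)}
    return 0.5 * sum(abs(probs[0][k] - probs[1][k]) for k in (0, 1))

# Keep candidate parent i for factor j when conditioning on it
# shifts the next-state distribution by more than a threshold.
threshold = 0.1
parents_of_x0 = [i for i in range(2) if tv_dependence(data, i, 0) > threshold]
parents_of_x1 = [i for i in range(2) if tv_dependence(data, i, 1) > threshold]
print(parents_of_x0, parents_of_x1)
```

On this toy chain the test recovers the true structure: x0 has no parents and x1's only parent is x0. A regret-minimizing algorithm would additionally have to choose the threshold from confidence bounds and interleave this estimation with optimistic planning, which is the hard part the paper addresses.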




A Hybrid Supervised and Self-Supervised Graph Neural Network for Edge-Centric Applications

Borzone, Eugenio, Di Persia, Leandro, Gerard, Matias

arXiv.org Artificial Intelligence

This paper presents a novel graph-based deep learning model for tasks involving relations between two nodes (edge-centric tasks), where the focus lies on predicting relationships and interactions between pairs of nodes rather than node properties themselves. The model combines supervised and self-supervised learning, with a loss function that accounts for the learned embeddings and for patterns both with and without ground truth. Additionally, it incorporates an attention mechanism that leverages both node and edge features. The architecture, trained end-to-end, comprises two primary components: embedding generation and prediction. First, a graph neural network (GNN) transforms raw node features into dense, low-dimensional embeddings, incorporating edge attributes. Then, a feedforward neural model processes the node embeddings to produce the final output. Experiments demonstrate that our model matches or exceeds existing methods for protein-protein interaction prediction and Gene Ontology (GO) term prediction. The model also performs effectively with one-hot encoding for node features, providing a solution for the previously unsolved problem of predicting similarity between compounds with unknown structures.
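The two-stage architecture the abstract describes (GNN embedding generation followed by a feedforward prediction head over node pairs) can be sketched minimally. The code below is a simplified illustration under assumed shapes, not the paper's model: it uses one round of mean message passing that concatenates neighbor and edge features, random weight matrices as stand-ins for trained parameters, and a small MLP that scores a node pair; it omits the attention mechanism and the self-supervised loss.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph: 4 nodes with 3-dim features, undirected edges with 2-dim features.
X = rng.normal(size=(4, 3))
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
E = rng.normal(size=(len(edges), 2))

# Stand-in parameters (in the real model these are learned end-to-end).
W_msg = rng.normal(size=(3 + 2, 8))   # [neighbor feat ; edge feat] -> message
W_self = rng.normal(size=(3, 8))      # self features -> embedding space

def embed(X, edges, E):
    """Stage 1: one round of mean message passing that incorporates
    edge attributes, producing an 8-dim embedding per node."""
    msgs = [[] for _ in range(len(X))]
    for (u, v), e in zip(edges, E):
        msgs[v].append(np.concatenate([X[u], e]) @ W_msg)
        msgs[u].append(np.concatenate([X[v], e]) @ W_msg)
    H = []
    for i in range(len(X)):
        agg = np.mean(msgs[i], axis=0) if msgs[i] else np.zeros(8)
        H.append(np.tanh(X[i] @ W_self + agg))
    return np.stack(H)

# Stage 2: feedforward head on the concatenated pair embedding.
W1 = rng.normal(size=(16, 8))
w2 = rng.normal(size=8)

def predict_link(H, u, v):
    """Score the relation between nodes u and v as a probability."""
    h = np.tanh(np.concatenate([H[u], H[v]]) @ W1)
    return 1 / (1 + np.exp(-(h @ w2)))

H = embed(X, edges, E)
score = predict_link(H, 0, 2)
```

Because the prediction head consumes only the pair of node embeddings, the same trained encoder can score arbitrary node pairs at inference time, which is what makes the design edge-centric.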


Artificial intelligence for template-free protein structure prediction: a comprehensive review - Artificial Intelligence Review

#artificialintelligence

Protein structure prediction (PSP) is a grand challenge in bioinformatics, drug discovery, and related fields. PSP is computationally challenging because of an astronomically large conformational space to be searched and an unknown, very complex energy function to be minimised. To obtain a given protein's structure, template-based PSP approaches adopt the known structure of a similar protein, while template-free PSP approaches work when no similar protein's structure is known. Currently, proteins with known structures are greatly outnumbered by proteins with unknown structures. Template-free PSP has made significant progress recently via machine learning and search-based optimisation approaches.


Scientists make first detection of exotic "X" particles in quark-gluon plasma

#artificialintelligence

In the first millionths of a second after the Big Bang, the universe was a roiling, trillion-degree plasma of quarks and gluons -- elementary particles that briefly glommed together in countless combinations before cooling and settling into more stable configurations to make the neutrons and protons of ordinary matter. In the chaos before cooling, a fraction of these quarks and gluons collided randomly to form short-lived "X" particles, so named for their mysterious, unknown structures. Today, X particles are extremely rare, though physicists have theorized that they may be created in particle accelerators through quark coalescence, where high-energy collisions can generate similar flashes of quark-gluon plasma. Now physicists at MIT's Laboratory for Nuclear Science and elsewhere have found evidence of X particles in the quark-gluon plasma produced in the Large Hadron Collider (LHC) at CERN, the European Organization for Nuclear Research, based near Geneva, Switzerland. The team used machine-learning techniques to sift through more than 13 billion heavy ion collisions, each of which produced tens of thousands of charged particles.


Pinaki Laskar on LinkedIn: #AGI #AI #machinelearning

#artificialintelligence

The detection of structures is based on a set of potential invariants that reflect the properties of the environment (space). An invariant is essentially a hypothesis that needs to be tested and confirmed or rejected. For the natural environment, the set of invariants reflects the properties of space-time and does not depend on the purpose of the system. A fixed set of invariants allows detecting previously unknown structures of arbitrary complexity. Detecting a previously unknown structure is essentially the construction of new concepts.


Machine Learning on DARWIN Datasets (MLD-I) Darwinex Blog

#artificialintelligence

Machine learning, in essence, is the research and application of algorithms that help us better understand data. By leveraging statistical learning techniques from the realm of machine learning, practitioners are able to draw meaningful inferences from data and turn it into actionable intelligence. Furthermore, the availability of several open-source machine learning tools, platforms and libraries today enables absolutely anyone to break into this field, utilizing a plethora of powerful algorithms to discover exploitable patterns in data and predict future outcomes. This development in particular has given rise to a new wave of DIY retail traders, creating sophisticated trading strategies that compete with (and in some cases outperform) others in a space previously dominated by institutional participants. In this introductory blog post, we will discuss supportive reasoning for, and different categories of, machine learning.